Harmonic Source Localization Approach Based on Fast Kernel Entropy Optimization ICA and Minimum Conditional Entropy
Abstract: This paper proposes a harmonic source localization method based on fast kernel entropy optimization independent component analysis (FKEO-ICA) and minimum conditional entropy, aimed at accurately estimating harmonic currents and identifying harmonic sources. The injected harmonic currents are estimated by FKEO-ICA without prior knowledge of the harmonic impedances; the minimum conditional entropy is then applied to locate the harmonic sources from the estimated currents. The proposed method is validated on the IEEE 34-bus system, and its performance is compared with that of three other ICA algorithms using the correlation coefficient and three error evaluation indicators. The results show that FKEO-ICA estimates harmonic currents significantly more accurately, while the minimum conditional entropy precisely determines the locations of the harmonic sources.
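To make the two-stage pipeline concrete, here is a minimal Python sketch. It is not the paper's implementation: scikit-learn's FastICA stands in for FKEO-ICA (whose details are not given here), the conditional entropy is a crude histogram estimate, and a random mixing matrix plays the role of the unknown harmonic impedance matrix.

```python
import numpy as np
from sklearn.decomposition import FastICA

def conditional_entropy(x, y, bins=16):
    """Histogram estimate of H(X|Y) = H(X,Y) - H(Y), in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return h(p_xy.ravel()) - h(p_xy.sum(axis=0))

# Toy setup: two independent "injected harmonic current" waveforms are
# mixed into four bus-voltage measurements through an unknown matrix Z
# standing in for the harmonic impedance matrix (V = I @ Z.T).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
I = np.c_[np.sign(np.sin(2 * np.pi * 5 * t)),  # square-wave-like injection
          rng.laplace(size=t.size)]            # impulsive injection
Z = rng.normal(size=(4, 2))
V = I @ Z.T

# Stage 1: blind estimation of the injected currents; FastICA is used
# here as a stand-in for FKEO-ICA, and no knowledge of Z is required.
I_est = FastICA(n_components=2, random_state=0).fit_transform(V)

# Stage 2: minimum conditional entropy localization -- for each recovered
# current, flag the bus whose measurement leaves the least residual
# uncertainty H(I_est | V_bus).
for k in range(I_est.shape[1]):
    scores = [conditional_entropy(I_est[:, k], V[:, b]) for b in range(V.shape[1])]
    print(f"estimated source {k} -> bus {int(np.argmin(scores))}")
```

The point the sketch illustrates is that stage 1 never uses Z: separation relies only on the statistical independence of the injected currents, which is exactly what removes the need for prior impedance knowledge.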
Similar Resources
ISAR Image Improvement Using STFT Kernel Width Optimization Based On Minimum Entropy Criterion
Nowadays, radar systems have many applications, and radar imaging is among the most important of them. Inverse Synthetic Aperture Radar (ISAR) is used to form images of moving targets. Conventional methods use the Fourier transform to retrieve Doppler information. However, because the target maneuvers, the Doppler spectrum becomes time-varying and the image is blurred. Joi...
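The minimum-entropy criterion this snippet refers to treats image entropy as a focus measure: a well-focused ISAR image has a peaky intensity distribution and hence low entropy. A minimal sketch of the idea, assuming a toy chirp-like signal, with scipy's STFT window length (nperseg) standing in for the kernel width:

```python
import numpy as np
from scipy.signal import stft

def image_entropy(img):
    """Shannon entropy of the normalized image intensity;
    a better-focused image is peakier, hence lower entropy."""
    p = img / img.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Toy range-bin signal whose Doppler drifts in time (a "maneuvering" target).
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
sig = np.exp(1j * 2 * np.pi * (100 * t + 80 * t**2))

# Sweep candidate STFT window (kernel) widths and keep the one that
# minimizes the entropy of the resulting time-frequency image.
scores = {}
for nperseg in (32, 64, 128, 256):
    _, _, Z = stft(sig, fs=fs, nperseg=nperseg, return_onesided=False)
    scores[nperseg] = image_entropy(np.abs(Z) ** 2)
print("minimum-entropy window width:", min(scores, key=scores.get))
```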
Minimax Mutual Information Approach for Independent Component Analysis
Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Two common approaches in information-theoretic ICA algorithms are minimum mutual information and maximum output entropy approaches. In the former approach, we substitute some form of probability density function (pdf) estima...
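The minimum-mutual-information criterion can be illustrated with a simple nonparametric estimate: ICA outputs are statistically independent exactly when their pairwise MI vanishes, so the MI of the separated components serves as a contrast function. A rough histogram-based sketch (a crude stand-in for the pdf estimators the snippet alludes to):

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution, in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(x, y, bins=32):
    """Histogram estimate of I(X;Y) = H(X) + H(Y) - H(X,Y);
    near zero when X and Y look independent."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    return entropy(p_xy.sum(axis=1)) + entropy(p_xy.sum(axis=0)) - entropy(p_xy.ravel())

rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 5000))             # independent sources: MI near 0
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s  # mixed signals: MI clearly > 0
print(mutual_information(s[0], s[1]), mutual_information(x[0], x[1]))
```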
Fast kernel entropy estimation and optimization
Differential entropy is a quantity used in many signal processing problems. Often we need to calculate not only the entropy itself, but also its gradient with respect to various variables, for efficient optimization, sensitivity analysis, etc. Entropy estimation can be based on an estimate of the probability density function, which is computationally costly if done naively. Some prior algorithm...
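The kernel-based entropy estimate the snippet refers to plugs a kernel density estimate into H(X) = -E[log p(X)]. A naive resubstitution version, assuming a Gaussian kernel via scipy, looks like this; evaluating the density at all N samples costs O(N^2) pairwise kernel terms, which is the bottleneck that fast kernel entropy estimation is designed to avoid:

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_entropy(x):
    """Resubstitution entropy estimate H ~ -(1/N) * sum_i log p_hat(x_i),
    with p_hat a Gaussian kernel density estimate (naive O(N^2) cost)."""
    p_hat = gaussian_kde(x)
    return -np.mean(np.log(p_hat(x)))

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
# True differential entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.4189 nats.
print(kde_entropy(x))
```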
ICA Using Kernel Entropy Estimation with NlogN Complexity
Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing nonparametric algorithms suffer from high complexity, particularly i...
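One standard route to near-NlogN cost for such kernel estimates (sketched here as an assumption about the general technique, not a transcription of the cited algorithm) is to bin the samples onto a regular grid and compute the kernel density estimate as a single FFT-based convolution:

```python
import numpy as np
from scipy.signal import fftconvolve

def fast_kde_entropy(x, n_grid=2048, bw=0.15):
    """Binned KDE entropy: histogram the N samples onto a grid, convolve
    the counts with a sampled Gaussian kernel via FFT (O(M log M) instead
    of the naive O(N^2) pairwise sum), then form the resubstitution
    estimate -(1/N) * sum_i log p_hat(x_i) on the grid."""
    lo, hi = x.min() - 5 * bw, x.max() + 5 * bw
    counts, edges = np.histogram(x, bins=n_grid, range=(lo, hi))
    dx = edges[1] - edges[0]
    m = int(4 * bw / dx)
    u = np.arange(-m, m + 1) * dx
    kernel = np.exp(-0.5 * (u / bw) ** 2) / (bw * np.sqrt(2 * np.pi))
    density = fftconvolve(counts, kernel, mode="same") / counts.sum()
    mask = counts > 0
    return -np.sum(counts[mask] / counts.sum() * np.log(density[mask]))

rng = np.random.default_rng(2)
print(fast_kde_entropy(rng.normal(size=100_000)))  # ~ 1.4189 nats for N(0,1)
```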
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
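Tsallis entropy generalizes Shannon entropy through an entropic index q: for a density p it is S_q = (1 - ∫ p(x)^q dx) / (q - 1), recovering Shannon entropy in the limit q → 1. A minimal sketch combining it with KDE, in the spirit of the snippet (the grid integration and parameter choices are illustrative assumptions):

```python
import numpy as np
from scipy.stats import gaussian_kde

def tsallis_entropy(x, q=2.0, n_grid=2000):
    """Tsallis entropy S_q = (1 - integral of p(x)^q dx) / (q - 1),
    with p estimated by a Gaussian KDE and the integral approximated
    by a Riemann sum on a regular grid."""
    p_hat = gaussian_kde(x)
    grid = np.linspace(x.min() - 1.0, x.max() + 1.0, n_grid)
    dx = grid[1] - grid[0]
    integral = np.sum(p_hat(grid) ** q) * dx
    return (1.0 - integral) / (q - 1.0)

rng = np.random.default_rng(3)
x = rng.normal(size=3000)
# For N(0,1) and q=2 the exact value is 1 - 1/(2*sqrt(pi)) ~ 0.718.
print(tsallis_entropy(x, q=2.0))
```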
Journal: Entropy
Volume: 18
Issue: -
Pages: -
Publication date: 2016